Release Notes for Q2 2024
Explore the new features and enhancements added in this update!
Updated in: July 2024
Release Version: 1.18
Feature/Enhancement | Description
---|---
Support for Salesforce integration with Amazon AppFlow to ingest data into a Snowflake table | With this release, we now support ingesting Salesforce data into a Snowflake table by using Amazon AppFlow. You can add tags as key-value pairs to the AppFlow objects that are stored in AWS. Tags help you manage, organize, search, and filter resources (see the AppFlow sketch after this table). See Data Ingestion from Salesforce to Snowflake using Amazon AppFlow.
Support for creating branch templates for custom code | With this release, we now support creating branch templates in the source code repository for the custom code of transformation jobs created in Databricks and Snowflake. You can promote and deploy a data pipeline to the required branch, depending on the defined branching structure.
Filtering and preview of data is now enabled in data crawlers and data catalogs | With this release, you can now apply specific conditions to table columns during data crawling to filter data according to your needs. Afterward, you can create a catalog with the filtered data and preview it in the Data Crawler Preview. You can apply additional filters on the catalog that is created and view it in the Data Catalog Preview. You can use the customized catalogs in data pipelines.
Support for triggering a pipeline by listening to notifications from Amazon SQS events | Data Pipeline Studio now supports triggering a pipeline by listening to notifications from Amazon SQS events. The pipeline consumes SQS events from AWS and, based on the notifications, triggers or terminates a pipeline run (see the SQS polling sketch after this table).
Renaming of steps in Snowflake and Databricks data integration, data transformation, and data quality jobs | Several steps in data integration, data transformation, and data quality jobs have been renamed.
Snowflake Partner ID implementation | With this release, Snowflake Partner Program telemetry is enabled in the Lazsa Platform. The toggle is ON by default and can be disabled manually.
Databricks Wheel Package upgrade available | The Lazsa Platform periodically upgrades the Databricks wheel package and indicates on the Data Pipeline Studio UI that an upgrade is available. In this release, the wheel package has been upgraded to a newer version.
Support for the latest versions of technologies | Support for the latest versions of several technologies is now available in the Lazsa Platform. To view all the supported tools and technologies, see Tools and Technologies Integrated with Lazsa Platform.
User provisioning support for GitLab | With this release, you can now provision users into GitLab repositories from within the Lazsa Platform. To provide GitLab access to new users, use a system-defined Product role or create a custom Product role, map the required GitLab project permissions to the role, and assign it to the intended users in your product team in Lazsa (see the python-gitlab sketch after this table).
New version of Lazsa Orchestrator Agent available | A new version (1.1.84) of the Lazsa Orchestrator Agent with simplified installation steps is now available. Manual steps such as creating a namespace in an EKS or AKS cluster, attaching the server certificate to the Ingress object, and creating a Docker registry secret are now included in the installation command. Upgrade to the latest version for a seamless experience. For upgrade steps, see Updating Lazsa Orchestrator Agent.
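
For readers curious what the Salesforce-to-Snowflake ingestion maps to on the AWS side, here is a minimal boto3 sketch of creating a tagged AppFlow flow. The Lazsa Platform configures this for you; the connector profile names, staging bucket, and target table below are hypothetical placeholders, and the sketch only illustrates the underlying AWS call, including the `tags` key-value pairs.

```python
# Minimal boto3 sketch: create an AppFlow flow that ingests a Salesforce
# object into a Snowflake table, with tags attached as key-value pairs.
# Connector profile names, bucket, and table are hypothetical placeholders.
import boto3

appflow = boto3.client("appflow", region_name="us-east-1")

appflow.create_flow(
    flowName="salesforce-accounts-to-snowflake",
    triggerConfig={"triggerType": "OnDemand"},
    sourceFlowConfig={
        "connectorType": "Salesforce",
        "connectorProfileName": "my-salesforce-profile",  # hypothetical
        "sourceConnectorProperties": {
            "Salesforce": {"object": "Account"}
        },
    },
    destinationFlowConfigList=[
        {
            "connectorType": "Snowflake",
            "connectorProfileName": "my-snowflake-profile",  # hypothetical
            "destinationConnectorProperties": {
                "Snowflake": {
                    "object": "PUBLIC.SALESFORCE_ACCOUNTS",  # target table
                    "intermediateBucketName": "my-appflow-staging-bucket",
                }
            },
        }
    ],
    # Map all Salesforce fields straight through to the destination.
    tasks=[
        {"taskType": "Map_all", "sourceFields": [], "taskProperties": {}}
    ],
    # Tags stored with the AppFlow object in AWS; these are what you use
    # to manage, organize, search, and filter resources.
    tags={"project": "lazsa-data-pipeline", "env": "dev"},
)
```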
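
The SQS-driven trigger boils down to a consume-and-route loop. The sketch below shows that pattern with boto3; the queue URL, the `action` field, and the `trigger_pipeline_run`/`terminate_pipeline_run` helpers are hypothetical stand-ins for what Data Pipeline Studio manages internally.

```python
# Minimal sketch of the notification-driven trigger pattern: poll an SQS
# queue and start or stop a pipeline run based on each message. The queue
# URL, message schema, and helper functions are hypothetical.
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/pipeline-events"

def trigger_pipeline_run(event: dict) -> None:
    print("starting pipeline run for", event)       # placeholder action

def terminate_pipeline_run(event: dict) -> None:
    print("terminating pipeline run for", event)    # placeholder action

while True:
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling to reduce empty receives
    )
    for message in response.get("Messages", []):
        event = json.loads(message["Body"])
        # Route on a hypothetical "action" field in the notification.
        if event.get("action") == "terminate":
            terminate_pipeline_run(event)
        else:
            trigger_pipeline_run(event)
        # Delete the message so it is not processed again.
        sqs.delete_message(
            QueueUrl=QUEUE_URL,
            ReceiptHandle=message["ReceiptHandle"],
        )
```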
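
On the GitLab side, role-based provisioning ultimately maps a user to a project with an access level. Here is a minimal python-gitlab sketch of that operation, assuming a hypothetical server URL, token, project path, and username; in Lazsa, the access level is derived from the GitLab project permissions mapped to the Product role.

```python
# Minimal python-gitlab sketch of what provisioning maps to on the GitLab
# side: adding a user to a project with a given access level. The URL,
# token, project path, and username are hypothetical.
import gitlab

gl = gitlab.Gitlab("https://gitlab.example.com", private_token="glpat-...")

# Look up the target project and the user to provision.
project = gl.projects.get("data-team/transformation-jobs")
user = gl.users.list(username="new.analyst")[0]

# Grant Developer access on the project.
project.members.create(
    {"user_id": user.id, "access_level": gitlab.const.AccessLevel.DEVELOPER}
)
```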